The Inescapable Conclusion: Machine Learning Is Not Like Your Brain - KDnuggets
The final article in this nine-part series summarizes the many reasons why Machine Learning is not like your brain - along with a few similarities. Hopefully, these articles have helped to explain the capabilities and limitations of biological neurons, how these relate to ML, and ultimately what will be needed to replicate the contextual knowledge of the human brain, enabling AI to attain true intelligence and understanding. In examining Machine Learning and the biological brain, the inescapable conclusion is that ML is not very much like a brain at all. In fact, the only similarity is that a neural network consists of things called neurons connected by things called synapses. Otherwise, the signals are different, the timescale is different, and the algorithms of ML are impossible in biological neurons for a number of reasons.
Machine Learning is Not Like Your Brain Part Seven: What Neurons are Good At - KDnuggets
In my undergraduate days, telephone switching was transitioning from electromechanical relays to transistors, so there were a lot of cast-off telephone relays available. Together with some fellow Electrical Engineering students, I built a computer out of telephone relays. The relays we used had a switching delay of 12ms -- that is, when you put power to the relay, the contacts would close 12ms later. Interestingly, this is in the same timing range as the roughly 4ms minimum interval between spikes when a neuron fires at its maximum rate. We also acquired a teletype machine which used a serial link running at 110 baud, or about 9ms per bit.
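The timescales above are easy to check directly. This short sketch (my own illustration, not from the article) computes the 110-baud bit time and lines it up against the relay delay and the neuron's minimum inter-spike interval:

```python
# Rough timescale comparison: relay switching delay, a neuron's minimum
# inter-spike interval at maximum firing rate, and the bit time of a
# 110-baud serial link. Values taken from the article's description.
relay_delay_ms = 12.0            # contacts close 12 ms after power is applied
neuron_min_interval_ms = 4.0     # ~4 ms between spikes => ~250 Hz peak rate
baud = 110
bit_time_ms = 1000.0 / baud      # one bit per baud interval, ~9.09 ms

print(f"relay delay:        {relay_delay_ms:.1f} ms")
print(f"neuron min period:  {neuron_min_interval_ms:.1f} ms")
print(f"110-baud bit time:  {bit_time_ms:.2f} ms")
```

All three fall within a single order of magnitude, which is the point of the anecdote: 1960s-era electromechanical hardware switched at roughly biological speed.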
Continual One-Shot Learning of Hidden Spike-Patterns with Neural Network Simulation Expansion and STDP Convergence Predictions
Lightheart, Toby, Grainger, Steven, Lu, Tien-Fu
This paper presents a constructive algorithm that achieves successful one-shot learning of hidden spike-patterns in a competitive detection task. It has previously been shown (Masquelier et al., 2008) that spike-timing-dependent plasticity (STDP) and lateral inhibition can result in neurons competitively tuned to repeating spike-patterns concealed in high rates of overall presynaptic activity. One-shot construction of neurons with synapse weights calculated as estimates of converged STDP outcomes results in immediate selective detection of hidden spike-patterns. The capability of continual learning is demonstrated through the successful one-shot detection of new sets of spike-patterns introduced after long intervals in the simulation time. Simulation expansion (Lightheart et al., 2013) has been proposed as an approach to the development of constructive algorithms that are compatible with simulations of biological neural networks. A simulation of a biological neural network may have orders of magnitude fewer neurons and connections than the related biological neural systems; therefore, simulated neural networks can be assumed to be a subset of a larger neural system. The constructive algorithm is developed using simulation expansion concepts to perform an operation equivalent to the exchange of neurons between the simulation and the larger hypothetical neural system. The dynamic selection of neurons to simulate within a larger neural system (hypothetical or stored in memory) may be a starting point for a wide range of developments and applications in machine learning and the simulation of biology.
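The abstract's key mechanism is that STDP drives synapses whose presynaptic spikes reliably precede the postsynaptic spike toward strong weights, while depressing the rest. As a minimal illustration of pair-based STDP (a standard textbook form; the parameter values and function are my own sketch, not the paper's constructive algorithm or its convergence estimates):

```python
import math

def stdp_dw(delta_t_ms, a_plus=0.03, a_minus=0.02, tau_ms=20.0):
    """Pair-based STDP weight change for delta_t = t_post - t_pre (ms).

    Pre-before-post (delta_t >= 0) potentiates; post-before-pre depresses.
    Amplitudes and time constant are illustrative placeholders.
    """
    if delta_t_ms >= 0:
        return a_plus * math.exp(-delta_t_ms / tau_ms)
    return -a_minus * math.exp(delta_t_ms / tau_ms)

# A synapse whose input spike repeatedly arrives just before the neuron
# fires (as inside a hidden spike-pattern) accumulates positive changes
# and converges high; background inputs that arrive after tend to zero.
print(stdp_dw(5.0))    # pre 5 ms before post -> positive change
print(stdp_dw(-5.0))   # pre 5 ms after post  -> negative change
```

The paper's one-shot construction can be read as skipping this iterative process: it estimates where such updates would converge and initializes new neurons' synapse weights at that estimate directly.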
Pulse-Firing Neural Chips for Hundreds of Neurons
Brownlow, Michael, Tarassenko, Lionel, Murray, Alan F., Hamilton, Alister, Han, Il Song, Reekie, H. Martin
Univ. of Edinburgh ABSTRACT We announce new CMOS synapse circuits using only three and four MOSFETs per synapse. Neural states are asynchronous pulse streams, upon which arithmetic is performed directly. Chips implementing over 100 fully programmable synapses are described and projections to networks of hundreds of neurons are made. 1 OVERVIEW OF PULSE FIRING NEURAL VLSI The inspiration for the use of pulse firing in silicon neural networks is clearly the electrical/chemical pulse mechanism in "real" biological neurons. Neurons fire voltage pulses of a frequency determined by their level of activity but of a constant magnitude (usually 5 Volts) [Murray,1989a]. As indicated in Figure 1, synapses perform arithmetic directly on these asynchronous pulses, to increment or decrement the receiving neuron's activity.
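The pulse-stream scheme the abstract describes, where synapses do arithmetic directly on pulses and activity sets the output frequency, can be sketched as a toy software model (my own simplification with made-up gain and rate parameters, not the CMOS circuit behavior):

```python
# Toy pulse-stream arithmetic: each presynaptic pulse bumps the receiving
# neuron's activity by the synapse weight (positive = excitatory,
# negative = inhibitory), and the activity level sets the output pulse rate.

def receive_pulses(activity, pulses, weight):
    """Accumulate weighted incoming pulses into the neuron's activity."""
    for p in pulses:
        activity += weight * p
    return activity

def firing_rate_hz(activity, gain=50.0, max_rate=250.0):
    """Map activity to an output pulse frequency, clipped to a ceiling."""
    return max(0.0, min(max_rate, gain * activity))

a = 0.0
a = receive_pulses(a, [1, 1, 1, 1], weight=0.5)   # excitatory pulse stream
a = receive_pulses(a, [1, 1], weight=-0.3)        # inhibitory pulse stream
print(firing_rate_hz(a))  # 50 * (4*0.5 - 2*0.3) = 70.0 Hz
```

In the silicon version this increment/decrement happens in analog charge, with only a handful of MOSFETs per synapse; the frequency coding is what lets constant-magnitude pulses carry a graded signal.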